Building Recurrent Neural Networks to Implement Multiple Attractor Dynamics Using the Gradient Descent Method
Authors
Abstract
This paper proposes a recurrent neural network model and a learning algorithm that can acquire the ability to generate multiple desired sequences. The network model is a dynamical system whose transition function is a contraction mapping, and the learning algorithm is based on the gradient descent method. We present numerical simulations in which a recurrent neural network learns a multiple periodic attractor consisting of five Lissajous curves, or a Van der Pol oscillator with twelve different parameter settings. Our analysis clarifies that the model contains many stable regions acting as attractors, and that multiple time series can be embedded into these regions by the proposed learning method.
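The abstract combines two ingredients: a contractive transition map and gradient-descent training toward target sequences. The sketch below illustrates that combination, but it is not the authors' exact model; the network size, the Lissajous target, the learning rate, and the use of finite-difference gradients (instead of backpropagation through time) are all assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target sequence: points on a Lissajous curve (illustrative choice).
T = 30
t = np.linspace(0, 2 * np.pi, T + 1)
seq = np.stack([np.sin(2 * t), np.sin(3 * t)], axis=1)  # shape (T+1, 2)

n = 6  # hidden state size (assumed)
params = {
    "W": rng.normal(scale=0.3, size=(n, n)),   # recurrent weights
    "U": rng.normal(scale=0.3, size=(n, 2)),   # input weights
    "V": rng.normal(scale=0.3, size=(2, n)),   # readout weights
}

def contract(W, rho=0.9):
    # Keep the recurrent map contractive: rescale W so its spectral
    # norm stays below rho < 1, so x -> tanh(W x + u) is a contraction.
    s = np.linalg.norm(W, 2)
    return W * (rho / s) if s > rho else W

def loss(p):
    # Teacher-forced next-step prediction error over the sequence.
    W = contract(p["W"])
    x = np.zeros(n)
    err = 0.0
    for k in range(T):
        x = np.tanh(W @ x + p["U"] @ seq[k])
        err += np.sum((p["V"] @ x - seq[k + 1]) ** 2)
    return err / T

def numerical_grad(p, eps=1e-5):
    # Central finite differences over every parameter entry
    # (kept simple here instead of full backpropagation through time).
    g = {}
    for name, val in p.items():
        gv = np.zeros_like(val)
        it = np.nditer(val, flags=["multi_index"])
        for _ in it:
            i = it.multi_index
            old = val[i]
            val[i] = old + eps; hi = loss(p)
            val[i] = old - eps; lo = loss(p)
            val[i] = old
            gv[i] = (hi - lo) / (2 * eps)
        g[name] = gv
    return g

before = loss(params)
lr = 0.02
for _ in range(50):
    g = numerical_grad(params)
    for name in params:
        params[name] -= lr * g[name]
    params["W"] = contract(params["W"])  # re-project after each step
after = loss(params)
print(f"loss before {before:.4f} -> after {after:.4f}")
```

The re-projection after each gradient step is one simple way to keep the transition map contractive during training; the paper's own mechanism for maintaining stable regions may differ.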
Similar Papers
A Gradient Descent Method for a Neural Fractal Memory
It has been demonstrated that higher order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve weight network can encode a very large set of different images. The main problem in this memory model, which so far has remai...
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feedforward neural networks. This analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...
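The "second momentum term" mentioned above can be illustrated in generic form. The sketch below is a hedged illustration on a toy quadratic objective, not the paper's exact update rule; the hyperparameter values and the two-term velocity update are assumptions chosen for the example.

```python
import numpy as np

# Gradient descent with a first momentum (velocity) term and an
# additional second momentum term that reuses the previous velocity,
# applied to f(w) = ||w||^2 / 2, whose gradient is simply w.
w = np.array([2.0, -3.0])
v = np.zeros_like(w)       # first momentum (current velocity)
v_prev = np.zeros_like(w)  # previous velocity, for the second term
lr, mu1, mu2 = 0.1, 0.9, 0.05  # illustrative hyperparameters

for _ in range(300):
    grad = w  # gradient of ||w||^2 / 2 at w
    # Two-term momentum update, then apply the velocity to the weights.
    v, v_prev = mu1 * v + mu2 * v_prev - lr * grad, v
    w = w + v

print(w)  # converges toward the minimum at the origin
```

On this quadratic the iteration is stable for these coefficients; in an actual backpropagation setting, the same update would be applied per weight with the gradient computed by the backward pass.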
An Investigation of the Gradient Descent Process in Neural Networks
Usually gradient descent is merely a way to find a minimum, abandoned if a more efficient technique is available. Here we investigate the detailed properties of the gradient descent process, and the related topics of how gradients can be computed, what the limitations on gradient descent are, and how the second-order information that governs the dynamics of gradient descent can be probed. To de...
A Biological Gradient Descent for Prediction Through a Combination of STDP and Homeostatic Plasticity
Identifying, formalizing, and combining biological mechanisms that implement known brain functions, such as prediction, is a main aspect of research in theoretical neuroscience. In this letter, the mechanisms of spike-timing-dependent plasticity and homeostatic plasticity, combined in an original mathematical formalism, are shown to shape recurrent neural networks into predictors. Following a r...
Journal:
- Adv. Artificial Neural Systems
Volume: 2009, Issue:
Pages: -
Publication date: 2009